l2,1 Regularized Correntropy for Robust Feature Selection

Authors

  • Ran He
  • Tieniu Tan
  • Liang Wang
  • Wei-Shi Zheng
Abstract

In this paper, we study the problem of robust feature extraction based on l2,1 regularized correntropy from both theoretical and algorithmic perspectives. On the theoretical side, we point out that l2,1-norm minimization can be justified from the viewpoint of half-quadratic (HQ) optimization, which facilitates both convergence study and algorithmic development. In particular, a general formulation is proposed that unifies l1-norm and l2,1-norm minimization within a common framework. On the algorithmic side, we propose an l2,1 regularized correntropy algorithm that extracts informative features while removing outliers from the training data. A new alternating minimization algorithm is also developed to optimize the non-convex correntropy objective. For face recognition, we apply the proposed method to obtain an appearance-based model, called Sparse-Fisherfaces. Extensive experiments show that our method can select robust and sparse features, and that it outperforms several state-of-the-art subspace methods on large-scale and open face recognition datasets.

Figure 1. A general framework for robust feature selection. First row: for a corrupted data matrix, we alternately remove outliers and redundant features. Second row: an illustration on the PEAL dataset. If two outliers corrupted by sunglasses are removed from the dataset, the features in the white box become the most discriminative for classifying different individuals.

In the pattern recognition and computer vision community, feature selection is a fundamental and important technique that aims to select a subset of relevant features while removing irrelevant and redundant ones from a high-dimensional feature set. Feature selection can improve generalization capability and speed up the learning process [11]. It also helps people better understand data properties and mitigates the curse of dimensionality. Over the past decades, various feature selection methods have been developed [6], among which sparsity regularization has recently become one of the most popular owing to its effectiveness, robustness, and efficiency. In l1-SVM (Support Vector Machine), an l1-norm regularization is incorporated into the SVM to perform feature selection [2]. To obtain a more structured regularization, Wang et al. [16] propose a hybrid huberized SVM (HHSVM) that combines the l1-norm and the l2-norm. Since HHSVM only handles binary classification, Argyriou et al. [1] further develop a similar l2,1 regularized model for feature selection in multi-task learning. Recently, Nie et al. [11] propose a robust feature selection method that imposes joint l2,1-norm minimization on both the loss function and the regularization term, along with a new iterative method that optimizes the l2,1-norm minimization efficiently. Building on [11], Gu et al. [5], Hou et al. [9], and Yang et al. [20] apply joint l2,1-norm minimization to subspace learning, sparse regression, and discriminative feature selection, respectively.

Although various methods based on l2,1-norm minimization have been proposed, the relationship between the optimization procedure in [11] and other methods (such as iteratively reweighted least squares and half-quadratic optimization) remains unclear, so further theoretical analysis is necessary. Toward this end, this paper presents both theoretical exploration and algorithmic development for l2,1-norm minimization. First, a half-quadratic analysis is given for l2,1-norm minimization. Based on this analysis, we can easily extend an l2,1-norm loss function to other loss functions and develop new algorithms. Then we present a general framework...
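For readers who want the objects behind these statements written out, the following is a brief summary of the standard definitions from the correntropy and l2,1-norm literature. The notation is mine, not taken from this page:

    % l2,1 norm of W in R^{d x c}: the l1 norm of the row-wise l2 norms;
    % minimizing it drives entire rows of W (i.e. entire features) to zero.
    \|W\|_{2,1} = \sum_{i=1}^{d} \|w^{i}\|_{2}
                = \sum_{i=1}^{d} \Big(\sum_{j=1}^{c} W_{ij}^{2}\Big)^{1/2}

    % Correntropy-induced loss on residuals e_i (Gaussian kernel, width sigma):
    % it grows quadratically near zero but saturates for large residuals,
    % which is the source of its robustness to outliers.
    J(e) = \sum_{i=1}^{n} \Big(1 - \exp\big(-e_i^{2}/2\sigma^{2}\big)\Big)

    % Half-quadratic (HQ) optimization introduces auxiliary weights
    % p_i = exp(-e_i^2 / 2 sigma^2) and alternates between updating the p_i
    % and solving the weighted quadratic surrogate  min_W sum_i p_i e_i^2,
    % which is how the non-convex objective above is made tractable.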

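As a concrete illustration of the alternating scheme the abstract describes, here is a minimal numpy sketch that combines HQ weights for a correntropy-style loss with the usual reweighted least-squares treatment of the l2,1 regularizer. This is my own reconstruction under a simple multi-output least-squares assumption, not the authors' exact algorithm; the function name l21_correntropy_fs and the one-hot target matrix Y are illustrative.

    import numpy as np

    def l21_correntropy_fs(X, Y, lam=0.1, sigma=1.0, n_iter=50, eps=1e-8):
        """Alternating-minimization sketch for
           min_W  sum_i (1 - exp(-||x_i W - y_i||^2 / (2 sigma^2))) + lam ||W||_{2,1}
        X: (n, d) data matrix, Y: (n, c) target matrix (e.g. one-hot labels)."""
        n, d = X.shape
        # Ridge initialization so the row norms of W are not all zero.
        W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
        for _ in range(n_iter):
            # HQ step: Gaussian-kernel sample weights; samples with large
            # residuals (outliers) get weights close to zero.
            E = X @ W - Y
            p = np.exp(-np.sum(E ** 2, axis=1) / (2.0 * sigma ** 2))   # (n,)
            # Reweighting for ||W||_{2,1}: per-feature weights 1/(2||w^i||),
            # which push whole rows of W (whole features) toward zero.
            dj = 1.0 / (2.0 * np.linalg.norm(W, axis=1) + eps)         # (d,)
            # With both weight sets fixed, W has a closed-form solution.
            XtP = X.T * p                     # equals X^T diag(p), shape (d, n)
            W = np.linalg.solve(XtP @ X + lam * np.diag(dj), XtP @ Y)
        return W

    # Features can then be ranked by the row norms of W; rows with near-zero
    # norm correspond to features the model has effectively discarded:
    # scores = np.linalg.norm(W, axis=1)

The two reweighting steps are exactly the HQ and iteratively-reweighted-least-squares mechanisms that, as the abstract argues, can be studied within one common framework.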

Similar articles

Maximum Correntropy Adaptive Filtering Approach for Robust Compressive Sensing Reconstruction

Robust compressive sensing (CS) reconstruction has become an attractive research topic in recent years. Robust CS aims to reconstruct sparse signals under non-Gaussian (i.e., heavy-tailed) noise, where traditional CS reconstruction algorithms may perform very poorly because they use the l2 norm of the residual vector in the optimization. Most existing robust CS reconstruction algorithms are based o...

A Regularized Correntropy Framework for Robust Pattern Recognition

This letter proposes a new multiple linear regression model using regularized correntropy for robust pattern recognition. First, we motivate the use of correntropy to improve the robustness of the classical mean square error (MSE) criterion, which is sensitive to outliers. Then an l1 regularization scheme is imposed on the correntropy to learn robust and sparse representations. Based on the half-q...

Graph Regularized Non-negative Matrix Factorization By Maximizing Correntropy

Non-negative matrix factorization (NMF) has proved effective in many clustering and classification tasks. The classic ways to measure the error between the original and the reconstructed matrix are the l2 distance and the Kullback-Leibler (KL) divergence. However, nonlinear cases are not properly handled by these error measures. As a consequence, alternative measures based on nonlinear kernels,...

Convex regularized recursive maximum correntropy algorithm

In this brief, a robust and sparse recursive adaptive filtering algorithm, called convex regularized recursive maximum correntropy (CR-RMC), is derived by adding a general convex regularization penalty term to the maximum correntropy criterion (MCC). An approximate expression for automatically selecting the regularization parameter is also introduced. Simulation results show that the CR-RMC can...

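The recursive CR-RMC algorithm itself is not spelled out in this preview, but the criterion it optimizes is easy to illustrate. Below is a minimal stochastic-gradient (LMS-style) sketch of adaptive filtering under the maximum correntropy criterion; it is a simpler cousin of the recursive, convex-regularized algorithm named above, not that algorithm, and the names mcc_lms, mu, and sigma are mine.

    import numpy as np

    def mcc_lms(x, d, order=4, mu=0.05, sigma=1.0):
        """LMS-style adaptive filter trained under the maximum correntropy
        criterion: each update is the ordinary LMS step scaled by the
        Gaussian kernel exp(-e^2 / (2 sigma^2)), so impulsive (large-error)
        samples barely move the weights.
        x: input signal (1-D array), d: desired signal, order: filter length."""
        w = np.zeros(order)
        e_hist = np.empty(len(x) - order)
        for n in range(order, len(x)):
            u = x[n - order:n][::-1]                   # current input regressor
            e = d[n] - w @ u                           # prediction error
            k = np.exp(-e ** 2 / (2.0 * sigma ** 2))   # correntropy weight
            w += mu * k * e * u                        # kernel-scaled LMS update
            e_hist[n - order] = e
        return w, e_hist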
Robust Semi-supervised Learning for Biometrics

To deal with the problem of sensitivity to noise in semi-supervised learning for biometrics, this paper proposes a robust Gaussian-Laplacian Regularized (GLR) framework based on the maximum correntropy criterion (MCC), called GLR-MCC, along with its convergence analysis. The half-quadratic (HQ) optimization technique is used to simplify the correntropy optimization problem to a standard semi-superv...


Publication date: 2012